LQR-Based Sparsification Algorithms of Consensus Networks

Authors
Abstract


Similar resources


H2-Clustering of Closed-loop Consensus Networks under a Class of LQR Design

In this paper we address the problem of clustering closed-loop consensus networks where the closed-loop controller is designed using a class of Linear Quadratic Regulator (LQR). Given any positive integer r, our objective is to develop a strategy for grouping the states of the n-node network into r ≤ n distinct non-overlapping groups. The criterion for this partitioning is defined as follows. First, an LQR controller is defined for the original n-node network. Then, an r-dimensional reduced-order network is created by imposing a projection matrix P on the n-node open-loop ...
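To make the projection step concrete, here is a minimal Python sketch, not the authors' algorithm: it designs an LQR gain for a small path-graph consensus network via the continuous-time Riccati equation and reduces the network with a normalized cluster-indicator matrix P. The weights Q and R and the fixed 3-group partition are illustrative assumptions; the paper selects the partition by an ℋ2 criterion, which is not implemented here.

```python
# A minimal sketch (not the paper's algorithm): LQR design for an
# n-node consensus network, then reduction to r clusters via a
# normalized indicator matrix P with orthonormal rows.
import numpy as np
from scipy.linalg import solve_continuous_are

# Path-graph Laplacian for a 6-node consensus network x' = -L x + u
n = 6
L = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
L[0, 0] = L[-1, -1] = 1

# LQR for the full network: minimize the integral of x'Qx + u'Ru
A, B = -L, np.eye(n)
Q, R = np.eye(n), np.eye(n)          # illustrative weights
X = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ X)      # closed loop: x' = (A - B K) x

# A fixed 3-group partition (assumed here; the paper chooses it
# to minimize an H2 error between full and reduced networks)
groups = [[0, 1], [2, 3], [4, 5]]
P = np.zeros((len(groups), n))
for i, g in enumerate(groups):
    P[i, g] = 1 / np.sqrt(len(g))    # rows orthonormal: P P^T = I

L_red = P @ L @ P.T                  # r-dimensional reduced network
print(L_red)
```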


A Review on Consensus Algorithms in Blockchain

Blockchain technology is a decentralized data storage structure based on a chain of data blocks that are linked to each other. A blockchain appends new blocks to the ledger without trusted intermediaries, through a competitive or voting mechanism. Because of the chain structure, or the graph linking each block to its previous blocks, it is practically impossible to alter block data. Blockchain architecture ...
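As a toy illustration of why chained hashes make past blocks tamper-evident, the sketch below (all field names are invented for this example, and it is not any real blockchain's format) links each block to the SHA-256 hash of its predecessor, so editing an early block breaks verification of every later link.

```python
# Toy hash chain: each block commits to the hash of the previous
# block, so altering one block invalidates all later links.
import hashlib, json

def block_hash(block):
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

chain = [{"index": 0, "data": "genesis", "prev": "0" * 64}]
for i, data in enumerate(["tx batch 1", "tx batch 2"], start=1):
    chain.append({"index": i, "data": data,
                  "prev": block_hash(chain[-1])})

def valid(chain):
    return all(chain[i]["prev"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

print(valid(chain))        # True
chain[1]["data"] = "evil"  # tamper with an early block
print(valid(chain))        # False: the chain no longer verifies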


Vertex sparsification and universal rounding algorithms

Suppose we are given a gigantic communication network, but are only interested in a small number of nodes (clients). There are many routing problems we could be asked to solve for our clients. Is there a much smaller network that we could write down on a sheet of paper and put in our pocket that approximately preserves all the relevant communication properties of the original network? As we will ...
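A small sketch of the idea, assuming networkx is available: build a naive candidate sparsifier on the terminal set and compare its terminal-pair minimum cuts to those of the original graph. The grid graph, terminal choice, and the complete-graph construction are all illustrative assumptions, not the paper's method.

```python
# Sketch: measure how well a small graph H on the terminals alone
# preserves minimum terminal-pair cuts of a big graph G.
import itertools
import networkx as nx

G = nx.grid_2d_graph(4, 4)                 # a 16-node "network"
nx.set_edge_attributes(G, 1, "capacity")
terminals = [(0, 0), (0, 3), (3, 0), (3, 3)]

# Naive candidate: a complete graph on the terminals whose edge
# capacities equal the corresponding min cuts in G.
H = nx.Graph()
for s, t in itertools.combinations(terminals, 2):
    cut = nx.minimum_cut_value(G, s, t, capacity="capacity")
    H.add_edge(s, t, capacity=cut)

# Quality: worst ratio of H's terminal cuts to G's terminal cuts.
worst = max(
    nx.minimum_cut_value(H, s, t, capacity="capacity")
    / nx.minimum_cut_value(G, s, t, capacity="capacity")
    for s, t in itertools.combinations(terminals, 2))
print("worst cut ratio:", worst)
```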


Bayesian Sparsification of Recurrent Neural Networks

Recurrent neural networks show state-of-the-art results in many text analysis tasks but often require a lot of memory to store their weights. Recently proposed Sparse Variational Dropout (Molchanov et al., 2017) eliminates the majority of the weights in a feed-forward neural network without significant loss of quality. We apply this technique to sparsify recurrent neural networks. To account for ...
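As a hedged sketch of the pruning rule in Sparse Variational Dropout (Molchanov et al., 2017): each weight carries a learned variance, and weights whose dropout rate α = σ²/w² grows large (the paper thresholds at log α > 3) are zeroed out. The arrays below stand in for quantities that would be learned during training.

```python
# NumPy sketch of the Sparse VD pruning rule: drop a weight when
# its learned dropout rate alpha = sigma^2 / w^2 exceeds a bound.
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4))                  # stand-in "learned" weights
log_sigma2 = rng.normal(-3, 2, size=(4, 4))  # stand-in log-variances

log_alpha = log_sigma2 - np.log(w ** 2)      # log(sigma^2 / w^2)
keep = log_alpha < 3.0                       # threshold from the paper
w_sparse = np.where(keep, w, 0.0)

print(f"sparsity: {1 - keep.mean():.0%}")
```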



Journal

Journal title: Electronics

Year: 2021

ISSN: 2079-9292

DOI: 10.3390/electronics10091082